First-order Methods for Geodesically Convex Optimization

Authors

  • Hongyi Zhang
  • Suvrit Sra
Abstract

Geodesic convexity generalizes the notion of (vector space) convexity to nonlinear metric spaces. But unlike convex optimization, geodesically convex (g-convex) optimization is much less developed. In this paper we contribute to the understanding of g-convex optimization by developing iteration complexity analysis for several first-order algorithms on Hadamard manifolds. Specifically, we prove upper bounds for the global complexity of deterministic and stochastic (sub)gradient methods for optimizing smooth and nonsmooth g-convex functions, both with and without strong g-convexity. Our analysis also reveals how the manifold geometry, especially sectional curvature, impacts convergence rates. To the best of our knowledge, our work is the first to provide global complexity analysis for first-order algorithms for general g-convex optimization.
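The first-order methods studied in this setting replace the Euclidean step x_{k+1} = x_k − η g_k with a step along a geodesic, x_{k+1} = Exp_{x_k}(−η g_k), where Exp is the manifold's exponential map. As a minimal, hedged sketch (not code from the paper), the snippet below instantiates this update for a classical g-convex problem, the Karcher (geometric) mean of symmetric positive definite matrices under the affine-invariant metric, which is a Hadamard manifold; the helper names, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def sym_fun(S, fun):
    """Apply a scalar function to a symmetric matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * fun(w)) @ V.T

def spd_exp(X, H):
    """Exponential map at X on the SPD manifold (affine-invariant metric)."""
    Xh = sym_fun(X, np.sqrt)
    Xih = sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ sym_fun(Xih @ H @ Xih, np.exp) @ Xh

def spd_log(X, A):
    """Logarithm map: the tangent vector at X whose geodesic reaches A at t = 1."""
    Xh = sym_fun(X, np.sqrt)
    Xih = sym_fun(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ sym_fun(Xih @ A @ Xih, np.log) @ Xh

def karcher_mean_gd(As, steps=200, eta=0.5):
    """Riemannian gradient descent for the g-convex Karcher mean problem
    f(X) = (1/2n) * sum_i d(X, A_i)^2, whose Riemannian gradient is
    grad f(X) = -(1/n) * sum_i Log_X(A_i). Illustrative sketch only."""
    X = np.mean(As, axis=0)                          # Euclidean mean as a starting point
    for _ in range(steps):
        grad = -sum(spd_log(X, A) for A in As) / len(As)
        X = spd_exp(X, -eta * grad)                  # geodesic step along the negative gradient
    return X
```

The eigendecomposition-based helper keeps every matrix function real and symmetric, which avoids the small complex round-off that general-purpose matrix logarithms can introduce on this manifold.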

Similar papers

Frank-Wolfe methods for geodesically convex optimization with application to the matrix geometric mean

We consider optimization of geodesically convex objectives over geodesically convex subsets of the manifold of positive definite matrices. In particular, for this task we develop Euclidean and Riemannian Frank-Wolfe (FW) algorithms. For both settings we analyze non-asymptotic convergence rates to global optimality. To our knowledge, these are the first results on Riemannian FW and its convergen...
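Schematically, and assuming the feasible set X is geodesically convex with a tractable "linear" minimization oracle, a Riemannian FW iteration can be written as

    z_k ∈ argmin_{z ∈ X} ⟨ grad f(x_k), Exp_{x_k}^{-1}(z) ⟩_{x_k},
    x_{k+1} = Exp_{x_k}( α_k · Exp_{x_k}^{-1}(z_k) ),   with α_k ∈ [0, 1], e.g. α_k = 2/(k+2);

on a flat manifold this reduces to the classical Euclidean Frank-Wolfe update. The notation above is a generic sketch and is not taken verbatim from the paper.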

Operator Scaling via Geodesically Convex Optimization, Invariant Theory and Polynomial Identity Testing

We propose a new second-order method for geodesically convex optimization on the natural hyperbolic metric over positive definite matrices. We apply it to solve the operator scaling problem in time polynomial in the input size and logarithmic in the error. This is an exponential improvement over previous algorithms which were analyzed in the usual Euclidean, commutative metric (for which the abo...

An optimization problem on the sphere

We prove existence and uniqueness of the minimizer for the average geodesic distance to the points of a geodesically convex set on the sphere. This implies a corresponding existence and uniqueness result for an optimal algorithm for halfspace learning, when data and target functions are drawn from the uniform distribution.

Riemannian SVRG: Fast Stochastic Optimization on Riemannian Manifolds

We study optimization of finite sums of geodesically smooth functions on Riemannian manifolds. Although variance reduction techniques for optimizing finite sums have attracted tremendous attention in recent years, existing work is limited to vector space problems. We introduce Riemannian SVRG (RSVRG), a new variance reduced Riemannian optimization method. We analyze RSVRG for both geodesica...
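Schematically, the variance-reduced step combines a stochastic Riemannian gradient at the current iterate with a correction evaluated at a snapshot point x̃ and parallel-transported into the current tangent space:

    v_t = grad f_i(x_t) − Γ_{x̃ → x_t}( grad f_i(x̃) − grad f(x̃) ),
    x_{t+1} = Exp_{x_t}( −η v_t ),

where Γ denotes parallel transport along the connecting geodesic; epoch structure, step sizes, and index choices are left to the paper, so this should be read as a sketch rather than the exact algorithm.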

Conic Geometric Optimization on the Manifold of Positive Definite Matrices

We develop geometric optimisation on the manifold of Hermitian positive definite (hpd) matrices. In particular, we consider optimising two types of cost functions: (i) geodesically convex (g-convex); and (ii) log-nonexpansive (LN). G-convex functions are nonconvex in the usual Euclidean sense, but convex along the manifold and thus allow global optimisation. LN functions may fail to be even g-c...

Publication date: 2016